Causal Equal Protection as Algorithmic Fairness
Marcello Di Bello, Nicolò Cangiotti, Michele Loi
Over the last ten years, the literature in computer science and philosophy has formulated different criteria of algorithmic fairness. One of the most discussed, classification parity, requires that the erroneous classifications of a predictive algorithm occur with equal frequency for groups picked out by protected characteristics. Despite its intuitive appeal, classification parity has come under attack. Multiple scenarios can be imagined in which, intuitively, a predictive algorithm does not treat any individual unfairly, and yet classification parity is violated. To make progress, we turn to a related principle, equal protection, originally developed in the context of criminal justice. Key to equal protection is equalizing the risks of erroneous classification (in a sense to be specified) rather than the rates of erroneous classification. We show that equal protection avoids many of the counterexamples to classification parity, but it also fails to model our moral intuitions in a number of common scenarios, for example, when the predictor is causally downstream of the protected characteristic. To address these difficulties, we defend a novel principle, causal equal protection, which models the fair allocation of the risks of erroneous classification through the lens of causality.
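Classification parity, as described in the abstract, is a checkable property of a classifier's outputs: error rates must coincide across protected groups. A minimal Python sketch (the function names, tolerance parameter, and toy data are illustrative assumptions, not taken from the paper) might compare false positive and false negative rates group by group:

```python
def error_rates(y_true, y_pred):
    """Return (false positive rate, false negative rate) for binary labels."""
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    neg = sum(1 for t in y_true if t == 0)
    pos = sum(1 for t in y_true if t == 1)
    return (fp / neg if neg else 0.0, fn / pos if pos else 0.0)

def classification_parity(y_true, y_pred, group, tol=0.0):
    """True when FPR and FNR agree (within tol) across all groups."""
    rates = {}
    for g in set(group):
        idx = [i for i, gi in enumerate(group) if gi == g]
        rates[g] = error_rates([y_true[i] for i in idx],
                               [y_pred[i] for i in idx])
    vals = list(rates.values())
    # Compare each group's pair of rates against the next group's.
    return all(abs(a[0] - b[0]) <= tol and abs(a[1] - b[1]) <= tol
               for a, b in zip(vals, vals[1:]))
```

On this sketch, a classifier that errs only on one group's members violates parity even if, intuitively, no individual in that group was treated unfairly — which is the shape of the counterexamples the abstract alludes to.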
A Legal Approach to "Affirmative Algorithms"
Proposed solutions to fix algorithmic bias could conflict with Supreme Court rulings on equal protection, two legal scholars note, and they propose a way forward. As AI and predictive algorithms permeate ever more areas of decision making, from setting bail to evaluating job applications to making home loans, what happens when an algorithm arbitrarily discriminates against women, African-Americans, or other groups? It happens all the time.